
    Data-adaptive harmonic spectra and multilayer Stuart-Landau models

    Harmonic decompositions of multivariate time series are considered, for which we adopt an integral operator approach with periodic semigroup kernels. Spectral decomposition theorems are derived that cover the important cases of two-time statistics drawn from a mixing invariant measure. The corresponding eigenvalues can be grouped per Fourier frequency and are given, at each frequency, as the singular values of a cross-spectral matrix depending on the data. These eigenvalues furthermore obey a variational principle that allows us to define a multidimensional power spectrum in a natural way. The eigenmodes, for their part, exhibit a data-adaptive character manifested in their phase, which in turn allows us to define a multidimensional phase spectrum. The resulting data-adaptive harmonic (DAH) modes reduce the data-driven modeling effort to elemental models stacked per frequency, coupled across frequencies only by a common noise realization. In particular, the DAH decomposition extracts time-dependent coefficients stacked by Fourier frequency which, provided the decay of temporal correlations is sufficiently well resolved, can be efficiently modeled within a class of multilayer stochastic models (MSMs) tailored here on stochastic Stuart-Landau oscillators. Applications to the Lorenz 96 model and to a stochastic heat equation driven by space-time white noise are considered. In both cases, the DAH decomposition allows for an extraction of spatio-temporal modes revealing key features of the dynamics in the embedded phase space. The multilayer Stuart-Landau models (MSLMs) are shown to successfully model the typical patterns of the corresponding time-evolving fields, as well as their statistics of occurrence. (Comment: 26 pages, double columns; 15 figures.)
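
    The per-frequency structure described above can be illustrated with a short, hedged sketch: estimate the cross-spectral matrix of a multivariate series with Welch-type averaging and take its singular values at each Fourier frequency, which play the role of the multidimensional power spectrum. This is only a schematic rendition of the DAH construction under simplifying assumptions; the estimator, function names, and toy data below are illustrative choices, not the authors' implementation.

```python
# Hedged sketch: per-frequency singular values of an estimated cross-spectral
# matrix, in the spirit of the DAH decomposition described above.
import numpy as np
from scipy.signal import csd

def cross_spectral_singular_values(x, fs=1.0, nperseg=256):
    """x: array of shape (n_channels, n_samples). Returns (freqs, sigma), where
    sigma[k, :] are the singular values of the estimated cross-spectral matrix
    at frequency freqs[k]."""
    d, _ = x.shape
    freqs, _ = csd(x[0], x[0], fs=fs, nperseg=nperseg)
    S = np.zeros((len(freqs), d, d), dtype=complex)
    for i in range(d):
        for j in range(d):
            _, S[:, i, j] = csd(x[i], x[j], fs=fs, nperseg=nperseg)
    # Singular values per frequency give a crude "multidimensional power spectrum".
    sigma = np.array([np.linalg.svd(S[k], compute_uv=False) for k in range(len(freqs))])
    return freqs, sigma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(4096)
    # Two noisy channels sharing an oscillation at 0.05 cycles per step.
    x = np.vstack([np.sin(2 * np.pi * 0.05 * t) + 0.5 * rng.standard_normal(t.size),
                   np.cos(2 * np.pi * 0.05 * t) + 0.5 * rng.standard_normal(t.size)])
    freqs, sigma = cross_spectral_singular_values(x)
    print("frequency of the leading singular value:", freqs[np.argmax(sigma[:, 0])])
```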

    On quantifying the climate of the nonautonomous Lorenz-63 model

    The Lorenz-63 model has been frequently used to inform our understanding of the Earth's climate and provide insight for numerical weather and climate prediction. Most studies have focused on the autonomous (time-invariant) model behaviour, in which the model's parameters are constants. Here we investigate the properties of the model under time-varying parameters, providing a closer parallel to the challenges of climate prediction, in which climate forcing varies with time. Initial condition (IC) ensembles are used to construct frequency distributions of model variables, and we interpret these distributions as the time-dependent climate of the model. Results are presented that demonstrate the impact of ICs on the transient behaviour of the model climate. The location in state space from which an IC ensemble is initiated is shown to significantly impact the time it takes for ensembles to converge. The implication for climate prediction is that the climate may, in parallel with weather forecasting, have states from which its future behaviour is more, or less, predictable in distribution. Evidence of resonant behaviour and path dependence is found in model distributions under time-varying parameters, demonstrating that prediction in nonautonomous nonlinear systems can be sensitive to the details of time-dependent forcing/parameter variations. Single model realisations are shown to be unable to reliably represent the model's climate, a result which has implications for how real-world climatic time series from observations are interpreted. The results have significant implications for the design and interpretation of Global Climate Model experiments. Over the past 50 years, insight from research exploring the behaviour of simple nonlinear systems has been fundamental in developing approaches to weather and climate prediction. The analysis herein utilises the much-studied Lorenz-63 model to understand the potential behaviour of nonlinear systems, such as the climate, when subject to time-varying external forcing, such as variations in atmospheric greenhouse gases or solar output. Our primary aim is to provide insight which can guide new approaches to climate model experimental design and thereby better address the uncertainties associated with climate change prediction. We use ensembles of simulations to generate distributions which we refer to as the "climate" of the time-variant Lorenz-63 model. In these ensemble experiments a model parameter is varied in a number of ways which can be seen as paralleling both idealised and realistic variations in external forcing of the real climate system. Our results demonstrate that the predictability of climate distributions under time-varying forcing can be highly sensitive to the specification of initial states in ensemble simulations. This result is superficially similar to the well-known initial-condition sensitivity in weather forecasting, but it has different origins and different implications for ensemble design. We also demonstrate the existence of resonant behaviour and a dependence on the details of the "forcing" trajectory, thereby highlighting further aspects of nonlinear system behaviour with important implications for climate prediction. Taken together, our results imply that current approaches to climate modelling may be at risk of under-sampling key uncertainties likely to be significant in predicting future climate.
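
    A minimal sketch of the kind of experiment described above, assuming a linear ramp in the parameter rho and an RK4 integrator; the forcing protocol, parameter values, ensemble size, and diagnostic below are illustrative choices, not the paper's exact set-up.

```python
# Illustrative sketch (not the paper's protocol): an initial-condition ensemble
# of the Lorenz-63 model integrated under a time-varying rho parameter.
import numpy as np

SIGMA, BETA = 10.0, 8.0 / 3.0

def rhs(states, rho):
    """Vectorized Lorenz-63 tendencies for an (n, 3) ensemble array."""
    x, y, z = states[:, 0], states[:, 1], states[:, 2]
    return np.column_stack([SIGMA * (y - x), x * (rho - z) - y, x * y - BETA * z])

def integrate_ensemble(states, rho_of_t, t_end=50.0, dt=0.01):
    """RK4 integration of every ensemble member under a time-dependent rho(t)."""
    n_steps = int(t_end / dt)
    for k in range(n_steps):
        t = k * dt
        k1 = rhs(states, rho_of_t(t))
        k2 = rhs(states + 0.5 * dt * k1, rho_of_t(t + 0.5 * dt))
        k3 = rhs(states + 0.5 * dt * k2, rho_of_t(t + 0.5 * dt))
        k4 = rhs(states + dt * k3, rho_of_t(t + dt))
        states = states + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return states

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # IC ensemble: a small cloud around a single point in state space.
    ics = np.array([1.0, 1.0, 1.0]) + 0.1 * rng.standard_normal((500, 3))
    # Example time-varying forcing: rho drifts slowly from 28 to 38.
    rho_ramp = lambda t: 28.0 + 10.0 * min(t / 50.0, 1.0)
    final = integrate_ensemble(ics, rho_ramp)
    # Frequency distribution of z across the ensemble at the final time.
    hist, edges = np.histogram(final[:, 2], bins=30)
    print("ensemble mean of z at t = 50:", final[:, 2].mean())
```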

    Statistical and dynamical properties of covariant Lyapunov vectors in a coupled atmosphere-ocean model—multiscale effects, geometric degeneracy, and error dynamics

    We study a simplified coupled atmosphere-ocean model using the formalism of covariant Lyapunov vectors (CLVs), which link physically based directions of perturbations to growth/decay rates. The model is obtained via a severe truncation of quasi-geostrophic equations for the two fluids, and includes a simple yet physically meaningful representation of their dynamical/thermodynamical coupling. The model has 36 degrees of freedom, and the parameters are chosen so that chaotic behaviour is observed. There are two positive Lyapunov exponents (LEs), sixteen negative LEs, and eighteen near-zero LEs. The presence of many near-zero LEs results from the vast separation between the characteristic time scales of the two fluids, and leads to nontrivial error growth properties in the tangent space spanned by the corresponding CLVs, which are geometrically very degenerate. Such CLVs correspond to two different classes of ocean/atmosphere coupled modes. The tangent space spanned by the CLVs corresponding to the positive and negative LEs has, instead, a non-pathological behaviour, and one can construct robust large-deviation laws for the finite-time LEs, thus providing a universal model for assessing predictability on long to ultra-long scales along such directions. Interestingly, the tangent space of the unstable manifold has substantial projection on both atmospheric and oceanic components. The results show the difficulties in using hyperbolicity as a conceptual framework for multiscale chaotic dynamical systems, whereas the framework of partial hyperbolicity seems better suited, possibly indicating an alternative definition for the chaotic hypothesis. They also suggest the need for an accurate analysis of error dynamics on different time scales and domains, and for a careful set-up of assimilation schemes, when looking at coupled atmosphere-ocean models.
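
    As a hedged illustration of the tangent-space machinery underlying CLVs, the sketch below computes a Lyapunov-exponent spectrum with the standard Benettin/QR method, using Lorenz-63 as a stand-in for the 36-variable coupled model (whose equations are not reproduced here). Computing the CLVs themselves would additionally require a backward iteration such as Ginelli's algorithm; everything here is a simplified substitute, not the paper's computation.

```python
# Minimal sketch of the Benettin/QR method for a Lyapunov-exponent spectrum,
# a standard building block of CLV algorithms, demonstrated on Lorenz-63.
import numpy as np

def lorenz63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def jacobian(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([[-sigma, sigma, 0.0],
                     [rho - z, -1.0, -x],
                     [y, x, -beta]])

def lyapunov_spectrum(n_steps=50_000, dt=0.01):
    s = np.array([1.0, 1.0, 1.0])
    Q = np.eye(3)                         # orthonormal tangent basis
    sums = np.zeros(3)
    for _ in range(n_steps):
        # Euler steps for brevity; a higher-order scheme is preferable in practice.
        s = s + dt * lorenz63(s)
        Q = Q + dt * jacobian(s) @ Q      # propagate the tangent basis
        Q, R = np.linalg.qr(Q)            # re-orthonormalise
        sums += np.log(np.abs(np.diag(R)))
    return sums / (n_steps * dt)

if __name__ == "__main__":
    # Roughly (0.9, 0.0, -14.6) for the standard parameters.
    print(lyapunov_spectrum())
```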

    Effects of stochastic parametrization on extreme value statistics

    Extreme geophysical events are of crucial relevance to our daily life: they threaten human lives and cause property damage. To assess the risk and reduce losses, we need to model and probabilistically predict these events. Parametrizations are computational tools used in Earth system models, which are aimed at reproducing the impact of unresolved scales on resolved scales. The performance of parametrizations has usually been examined on typical events rather than on extreme events. In this paper, we consider a modified version of the two-level Lorenz '96 model and investigate how two parametrizations of the fast degrees of freedom perform in terms of the representation of extreme events. One parametrization follows Wilks [Q. J. R. Meteorol. Soc. 131, 389–407 (2005)] and is constructed through an empirical fitting procedure; the other parametrization is constructed through the statistical mechanical approach proposed by Wouters and Lucarini [J. Stat. Mech. Theory Exp. 2012, P03003 (2012); J. Stat. Phys. 151, 850–860 (2013)]. The two strategies show different advantages and disadvantages. We discover that the agreement between parametrized models and the true model is in general worse when looking at extremes rather than at the bulk of the statistics. The results suggest that stochastic parametrizations should be accurately and specifically tested against their performance on extreme events, as usual optimization procedures might neglect them. The provision of accurate parametrizations is a task of paramount importance in many scientific areas and specifically in weather and climate modeling. Parametrizations are needed for representing accurately and efficiently the impact of the scales of motion and of the processes that cannot be explicitly represented by the numerical model. Parametrizations are usually constructed in order to optimize the overall performance of the model, thus aiming at an accurate representation of the bulk of the statistics. Nonetheless, numerical models are key to estimating, anticipating, and predicting extreme events. Here, we critically analyze, in a simple yet illustrative example, the performance of parametrizations in describing extreme events, and we conclude that good performance under typical conditions cannot in any way be extrapolated to rare conditions, which could, nonetheless, be of great practical relevance.
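
    A hedged sketch of the general set-up: a single-level Lorenz '96 model in which the effect of the unresolved fast variables is replaced by a Wilks-style deterministic polynomial plus AR(1) noise. The polynomial coefficients, noise parameters, resolution, and time step below are placeholders for illustration, not the fitted values or configuration used in the paper.

```python
# Hedged sketch of a Wilks-style stochastic parametrization for Lorenz '96:
# the unresolved tendency is replaced by a polynomial in the resolved variable
# plus AR(1) ("red") noise. All numerical values are placeholders.
import numpy as np

K, F, DT = 36, 10.0, 0.001
POLY = np.array([0.0, 1.0, 0.0, 0.0])        # placeholder coefficients b0..b3
PHI, SIGMA_E = 0.98, 0.3                      # AR(1) memory and noise amplitude

def tendency(x, g):
    """dX_k/dt = -X_{k-1}(X_{k-2} - X_{k+1}) - X_k + F - g_k, where g_k is the
    parametrized effect of the unresolved (fast) variables on X_k."""
    return -np.roll(x, 1) * (np.roll(x, 2) - np.roll(x, -1)) - x + F - g

def step(x, e, rng):
    # Deterministic part of the parametrization: cubic polynomial in X_k.
    g_det = POLY[0] + POLY[1] * x + POLY[2] * x**2 + POLY[3] * x**3
    # Stochastic part: one AR(1) noise process per resolved variable.
    e = PHI * e + SIGMA_E * np.sqrt(1.0 - PHI**2) * rng.standard_normal(K)
    x = x + DT * tendency(x, g_det + e)       # Euler step, kept simple on purpose
    return x, e

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    x, e = F + rng.standard_normal(K), np.zeros(K)
    running_max = -np.inf
    for n in range(100_000):
        x, e = step(x, e, rng)
        if n > 20_000:                        # discard spin-up before sampling extremes
            running_max = max(running_max, x.max())
    print("largest value of X seen after spin-up:", running_max)
```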

    Response formulae for n-point correlations in statistical mechanical systems and application to a problem of coarse graining

    Predicting the response of a system to perturbations is a key challenge in the mathematical and natural sciences. Under suitable conditions on the nature of the system, of the perturbation, and of the observables of interest, response theories allow one to construct operators describing the smooth change of the invariant measure of the system of interest as a function of the small parameter controlling the intensity of the perturbation. In particular, response theories can be developed both for stochastic and for chaotic deterministic dynamical systems, where in the latter case stricter conditions imposing some degree of structural stability are required. In this paper we extend previous findings and derive general response formulae describing how n-point correlations are affected by perturbations to the vector flow. We also show how to compute the response of the spectral properties of the system to perturbations. We then apply our results to the seemingly unrelated problem of coarse graining in multiscale systems: we find explicit formulae describing the change in the terms describing the parameterisation of the neglected degrees of freedom resulting from applying perturbations to the full system. All the terms envisioned by the Mori-Zwanzig theory - the deterministic, stochastic, and non-Markovian terms - are affected at first order in the perturbation. The obtained results provide a more comprehensive understanding of the response of statistical mechanical systems to perturbations and contribute to the goal of constructing accurate and robust parameterisations; they are of potential relevance for fields like molecular dynamics, condensed matter, and geophysical fluid dynamics. We envision possible applications of our general results to the study of the response of climate variability to anthropogenic and natural forcing and to the study of the equivalence of thermostatted statistical mechanical systems.
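
    A minimal worked example of first-order response, assuming an Ornstein-Uhlenbeck toy model rather than the general setting of the paper: the shift of the mean induced by a small constant perturbation of the drift is estimated by direct simulation and compared with the first-order prediction eps/theta. The model, parameters, and the use of a common noise sequence are illustrative assumptions.

```python
# Toy illustration of linear response (not the paper's n-point formulae): the
# response of the mean of an Ornstein-Uhlenbeck process to a constant shift of
# its drift, measured from paired runs sharing the same noise realization.
import numpy as np

THETA, SIGMA, DT, N = 1.0, 0.5, 0.01, 500_000

def ou_time_mean(forcing, noise):
    """Long-run time average of x for dx = (-THETA*x + forcing) dt + SIGMA dW."""
    x, total = 0.0, 0.0
    for w in noise:
        x += DT * (-THETA * x + forcing) + SIGMA * np.sqrt(DT) * w
        total += x
    return total / len(noise)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    noise = rng.standard_normal(N)            # same noise for both runs
    eps = 0.1                                  # small perturbation of the flow
    response = ou_time_mean(eps, noise) - ou_time_mean(0.0, noise)
    print("measured response of the mean:", response)
    print("first-order prediction eps/THETA:", eps / THETA)
```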

    Dimension reduction for systems with slow relaxation

    We develop reduced, stochastic models for high-dimensional, dissipative dynamical systems that relax very slowly to equilibrium and can encode long-term memory. We present a variety of empirical and first-principles approaches for model reduction, and build a mathematical framework for analyzing the reduced models. We introduce the notions of universal and asymptotic filters to characterize 'optimal' model reductions for sloppy linear models. We illustrate our methods by applying them to the practically important problem of modeling evaporation in oil spills. (Comment: 48 pages, 13 figures. Paper dedicated to the memory of Leo Kadanoff.)

    DADA: data assimilation for the detection and attribution of weather and climate-related events

    A new nudging method for data assimilation, delay-coordinate nudging, is presented. Delay-coordinate nudging makes explicit use of present and past observations in the formulation of the forcing driving the model evolution at each time step. Numerical experiments with a low-order chaotic system show that the new method systematically outperforms standard nudging in different model and observational scenarios, even when using an unoptimized formulation of the delay-nudging coefficients. A connection between the optimal delay and the dominant Lyapunov exponent of the dynamics is found based on heuristic arguments and is confirmed by the numerical results, providing a guideline for the practical implementation of the algorithm. Delay-coordinate nudging preserves the ease of implementation, intuitive functioning, and reduced computational cost of standard nudging, making it a potential alternative especially for seasonal-to-decadal predictions with large Earth system models, whose size limits the use of more sophisticated data assimilation procedures.
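
    A hedged sketch of the idea on Lorenz-63: the model is relaxed both toward the present observation and toward a past observation (compared with the model's own past state), with only the x variable observed. The nudging coefficients, delay, observation noise, and integrator are arbitrary illustrative choices, not the paper's optimized formulation.

```python
# Illustrative sketch of nudging toward present and time-delayed observations
# on Lorenz-63; all numerical choices below are placeholders.
import numpy as np

def l63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def run_truth_and_obs(n, dt, rng, obs_noise=0.5):
    """Generate a reference trajectory and noisy observations of x only."""
    truth = np.empty((n, 3))
    s = np.array([1.0, 1.0, 1.0])
    for k in range(n):
        s = s + dt * l63(s)
        truth[k] = s
    obs = truth[:, 0] + obs_noise * rng.standard_normal(n)
    return truth, obs

def delay_nudging(obs, n, dt, k_now=5.0, k_delay=2.0, delay_steps=20):
    s = np.array([5.0, 5.0, 25.0])            # deliberately wrong initial condition
    traj = np.empty((n, 3))
    for k in range(n):
        # Standard nudging term: mismatch with the present observation.
        forcing = k_now * (obs[k] - s[0])
        # Delay term: mismatch between a past observation and the model's own past.
        if k >= delay_steps:
            forcing += k_delay * (obs[k - delay_steps] - traj[k - delay_steps, 0])
        ds = l63(s)
        ds[0] += forcing                       # nudging acts on the observed variable
        s = s + dt * ds
        traj[k] = s
    return traj

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    n, dt = 5000, 0.01
    truth, obs = run_truth_and_obs(n, dt, rng)
    est = delay_nudging(obs, n, dt)
    err = est[n // 2:, 2] - truth[n // 2:, 2]
    print("RMSE on the unobserved z over the second half:", np.sqrt(np.mean(err ** 2)))
```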

    The writing on the wall: the concealed communities of the East Yorkshire horselads

    This paper examines the graffiti found within late nineteenth- and early twentieth-century farm buildings in the Wolds of East Yorkshire. It suggests that the graffiti were created by a group of young men at the bottom of the social hierarchy, the horselads, and were one of the ways in which they constructed a distinctive sense of communal identity at a particular stage in their lives. Whilst the graffiti tell us much about changing agricultural regimes and social structures, they also inform us about experiences and attitudes often hidden from official histories and biographies. In this way, the graffiti are argued to inform our understanding not only of a concealed community, but also of their hidden history.

    Predicting climate change using response theory: global averages and spatial patterns

    The provision of accurate methods for predicting the climate response to anthropogenic and natural forcings is a key contemporary scientific challenge. Using a simplified and efficient open-source general circulation model of the atmosphere featuring O(10^5) degrees of freedom, we show how it is possible to approach such a problem using nonequilibrium statistical mechanics. Response theory allows one to practically compute the time-dependent measure supported on the pullback attractor of the climate system, whose dynamics is non-autonomous as a result of time-dependent forcings. We propose a simple yet efficient method for predicting, at any lead time and in an ensemble sense, the change in climate properties resulting from an increase in the concentration of CO2, using test perturbation model runs. We assess the strengths and limitations of the response theory in predicting the changes in the globally averaged values of surface temperature and of the yearly total precipitation, as well as in their spatial patterns. The quality of the predictions obtained for the surface temperature fields is rather good, while in the case of precipitation a good skill is observed only for the global average. We also show how it is possible to accurately define concepts like the inertia of the climate system, or to predict when climate change is detectable given a scenario of forcing. Our analysis can be extended to deal with more complex portfolios of forcings and can be adapted to treat, in principle, any climate observable. Our conclusion is that climate change is indeed a problem that can be effectively seen through a statistical mechanical lens, and that there is great potential for optimizing the current coordinated modelling exercises run for the preparation of the subsequent reports of the Intergovernmental Panel on Climate Change.
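
    The workflow can be caricatured with a toy example: estimate a Green's function from step-forcing "test perturbation" ensembles, then predict the ensemble-mean response to a different forcing scenario by convolution. A one-dimensional linear stochastic model stands in for the atmospheric GCM; every parameter and name below is an illustrative assumption, not the paper's set-up.

```python
# Toy illustration of the response-theory workflow: estimate a Green's function
# from step-forcing runs, then predict the response to a new scenario by convolution.
import numpy as np

DT, N = 0.1, 2000
rng = np.random.default_rng(5)

def run(forcing):
    """Toy 'climate model': a damped scalar variable driven by forcing plus noise."""
    x, out = 0.0, np.empty(N)
    for k in range(N):
        x += DT * (-0.2 * x + forcing[k]) + 0.05 * np.sqrt(DT) * rng.standard_normal()
        out[k] = x
    return out

def ensemble_mean(forcing, members=200):
    return np.mean([run(forcing) for _ in range(members)], axis=0)

if __name__ == "__main__":
    baseline = ensemble_mean(np.zeros(N))
    # Step-forcing test runs give the integrated Green's function; differentiate it.
    step_response = ensemble_mean(np.full(N, 1.0)) - baseline
    green = np.gradient(step_response, DT)
    # Predict the response to a new scenario (a slow ramp) by convolution ...
    ramp = np.linspace(0.0, 1.0, N)
    predicted = DT * np.convolve(green, ramp)[:N]
    # ... and compare with a direct ensemble run under the same scenario.
    direct = ensemble_mean(ramp) - baseline
    print("max abs error of the predicted response:", np.abs(predicted - direct).max())
```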
